Kullback-Leibler Approach to Gaussian Mixture Reduction
Authors
A. R. Runnalls
Abstract
Similar Resources
A Kullback-Leibler Approach to Gaussian Mixture Reduction
A common problem in multi-target tracking is to approximate a Gaussian mixture by one containing fewer components; similar problems can arise in integrated navigation. A common approach is successively to merge pairs of components, replacing the pair with a single Gaussian component whose moments up to second order match those of the merged pair. Salmond [1] and Williams [2], [3] have each prop...
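To make the moment-preserving merge concrete, here is a minimal sketch in Python/NumPy (the function names merge_moment_matched and kl_merge_cost are illustrative, not taken from the paper). It replaces a pair of weighted components by a single Gaussian whose zeroth-, first- and second-order moments match those of the pair, and evaluates the kind of KL-based dissimilarity bound used to rank candidate merges.

```python
import numpy as np

def merge_moment_matched(w1, m1, P1, w2, m2, P2):
    """Replace two weighted Gaussian components (weight, mean, covariance)
    by one component whose moments up to second order match the pair."""
    w = w1 + w2
    m = (w1 * m1 + w2 * m2) / w
    d1 = (m1 - m)[:, None]
    d2 = (m2 - m)[:, None]
    P = (w1 * (P1 + d1 @ d1.T) + w2 * (P2 + d2 @ d2.T)) / w
    return w, m, P

def kl_merge_cost(w1, m1, P1, w2, m2, P2):
    """KL-based dissimilarity of a candidate merge: an upper bound on the
    discrepancy introduced by replacing the pair with its moment-matched merge."""
    _, _, P = merge_moment_matched(w1, m1, P1, w2, m2, P2)
    logdet = lambda A: np.linalg.slogdet(A)[1]
    return 0.5 * ((w1 + w2) * logdet(P) - w1 * logdet(P1) - w2 * logdet(P2))

# Example: two 2-D components; in practice the pair with the smallest cost
# is merged first, and the process repeats until the target size is reached.
w, m, P = merge_moment_matched(0.6, np.zeros(2), np.eye(2),
                               0.4, np.ones(2), 2.0 * np.eye(2))
print(kl_merge_cost(0.6, np.zeros(2), np.eye(2),
                    0.4, np.ones(2), 2.0 * np.eye(2)))
```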
Gaussian Mixture Reduction Using Reverse Kullback-Leibler Divergence
We propose a greedy mixture reduction algorithm which is capable of pruning mixture components as well as merging them based on the Kullback-Leibler divergence (KLD). The algorithm is distinct from the well-known Runnalls’ KLD based method since it is not restricted to merging operations. The capability of pruning (in addition to merging) gives the algorithm the ability of preserving the peaks ...
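As an illustration only, and not the authors' exact algorithm, the sketch below runs a greedy reduction loop that can both prune and merge: components below a weight threshold are discarded (a placeholder criterion standing in for the paper's reverse-KLD pruning rule), and remaining pairs are merged in order of a KL-based merge cost until a target number of components is reached.

```python
import numpy as np

def reduce_mixture(weights, means, covs, target, prune_threshold=1e-3):
    """Illustrative greedy reduction: prune negligible components, then
    repeatedly merge the pair with the smallest KL-based merge cost."""
    comps = list(zip(weights, means, covs))

    # Pruning step: drop components whose weight falls below a threshold
    # (a simple surrogate, not the paper's reverse-KLD criterion), renormalize.
    comps = [c for c in comps if c[0] >= prune_threshold]
    total = sum(w for w, _, _ in comps)
    comps = [(w / total, m, P) for w, m, P in comps]

    def merged(c1, c2):
        (w1, m1, P1), (w2, m2, P2) = c1, c2
        w = w1 + w2
        m = (w1 * m1 + w2 * m2) / w
        d1, d2 = (m1 - m)[:, None], (m2 - m)[:, None]
        P = (w1 * (P1 + d1 @ d1.T) + w2 * (P2 + d2 @ d2.T)) / w
        return w, m, P

    def cost(c1, c2):
        (w1, _, P1), (w2, _, P2) = c1, c2
        _, _, P = merged(c1, c2)
        logdet = lambda A: np.linalg.slogdet(A)[1]
        return 0.5 * ((w1 + w2) * logdet(P) - w1 * logdet(P1) - w2 * logdet(P2))

    # Merging steps until the requested number of components remains.
    while len(comps) > target:
        i, j = min(((i, j) for i in range(len(comps))
                    for j in range(i + 1, len(comps))),
                   key=lambda ij: cost(comps[ij[0]], comps[ij[1]]))
        comps[i] = merged(comps[i], comps[j])
        del comps[j]
    return comps
```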
Gaussian Kullback-Leibler approximate inference
We investigate Gaussian Kullback-Leibler (G-KL) variational approximate inference techniques for Bayesian generalised linear models and various extensions. In particular we make the following novel contributions: sufficient conditions for which the G-KL objective is differentiable and convex are described; constrained parameterisations of Gaussian covariance that make G-KL methods fast and scal...
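A basic building block in such Gaussian KL approximations is the closed-form KL divergence between the Gaussian approximation q and a Gaussian prior p. The minimal sketch below (NumPy; the naming is illustrative, not from the paper) evaluates it.

```python
import numpy as np

def kl_gauss(m_q, S_q, m_p, S_p):
    """Closed-form KL( N(m_q, S_q) || N(m_p, S_p) )."""
    d = m_q.shape[0]
    S_p_inv = np.linalg.inv(S_p)
    diff = m_p - m_q
    _, ld_q = np.linalg.slogdet(S_q)
    _, ld_p = np.linalg.slogdet(S_p)
    return 0.5 * (np.trace(S_p_inv @ S_q) + diff @ S_p_inv @ diff
                  - d + ld_p - ld_q)

# Example: a tightened Gaussian approximation against a unit-variance prior.
print(kl_gauss(np.zeros(2), 0.5 * np.eye(2), np.zeros(2), np.eye(2)))
```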
Gaussian Approximations of Small Noise Diffusions in Kullback-Leibler Divergence
We study Gaussian approximations to the distribution of a diffusion. The approximations are easy to compute: they are defined by two simple ordinary differential equations for the mean and the covariance. Time correlations can also be computed via solution of a linear stochastic differential equation. We show, using the Kullback-Leibler divergence, that the approximations are accurate...
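For orientation only: under a standard linearization closure (which may differ in detail from the paper's construction), the mean m and covariance C of a Gaussian approximation to dx = f(x) dt + sqrt(eps) dW evolve as m' = f(m) and C' = J(m) C + C J(m)^T + eps * Sigma, with J the Jacobian of the drift. The sketch below Euler-integrates these ODEs; the drift, Jacobian, and noise level are placeholders.

```python
import numpy as np

def gaussian_approx_ode(f, jac, Sigma, eps, m0, C0, dt, n_steps):
    """Euler-integrate the mean/covariance ODEs of a linearized Gaussian
    approximation to dx = f(x) dt + sqrt(eps) dW."""
    m, C = m0.copy(), C0.copy()
    for _ in range(n_steps):
        J = jac(m)
        m = m + dt * f(m)
        C = C + dt * (J @ C + C @ J.T + eps * Sigma)
    return m, C

# Example: a scalar Ornstein-Uhlenbeck-like drift f(x) = -x (placeholder).
f = lambda x: -x
jac = lambda x: -np.eye(1)
m, C = gaussian_approx_ode(f, jac, np.eye(1), 0.1,
                           np.array([1.0]), np.eye(1), 1e-3, 5000)
print(m, C)
```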
A Kullback-Leibler Distance Approach to System Identification
The use of probability in system identification is shown to be equivalent to measuring Kullback-Leibler distance between the actual (empirical) and model distributions of data. When data are not known completely (being compressed, quantized, aggregated, missing etc.), the minimum distance approach can be seen as an asymptotic approximation of probabilistic inference. A class of problems is poin...
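The equivalence at the heart of this view can be checked in a few lines: for fully observed data, the KL divergence from the empirical distribution to a parametric model equals the negative average log-likelihood minus the empirical entropy, so minimizing the KL distance and maximizing the likelihood coincide. The categorical example below is illustrative and not taken from the paper.

```python
import numpy as np

# Observed symbols and their empirical distribution.
data = np.array([0, 1, 1, 2, 1, 0, 2, 1])
counts = np.bincount(data, minlength=3)
p_emp = counts / counts.sum()

def kl(p, q):
    """KL(p || q) for discrete distributions, skipping zero-probability bins."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def neg_avg_loglik(q):
    """Negative average log-likelihood of the data under model q."""
    return -float(np.mean(np.log(q[data])))

q = np.array([0.2, 0.5, 0.3])  # candidate model distribution
entropy = -float(np.sum(p_emp[p_emp > 0] * np.log(p_emp[p_emp > 0])))

# KL(empirical || model) = negative average log-likelihood - empirical entropy.
assert np.isclose(kl(p_emp, q), neg_avg_loglik(q) - entropy)
```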
Journal
Journal title: IEEE Transactions on Aerospace and Electronic Systems
Year: 2007
ISSN: 0018-9251
DOI: 10.1109/taes.2007.4383588